Optimal Decay Rate of Connection Weights in Covariance Learning
Abstract
An associative memory cannot store more items than its memory capacity. When new items are given one after another, the connection weights should be decayed so that the number of stored items does not exceed the memory capacity. This report presents the optimal decay rate that maximizes the number of stored items.

1. Introduction
Memory capacity is not only a yardstick for the ability of a neural network as a memory model; it is also related to the generalization ability of the network, and is therefore an important problem. It is also known, and has attracted attention, that sparsely encoding the output patterns of an associative memory makes the memory capacity very large [1]. However, the neural network itself does not keep count of how many patterns it has stored, and so it cannot judge whether the memory capacity has been exceeded. Therefore, … The first term on the right-hand side of Eq. (1) is a decay term introduced for convenience so that the connection weights do not diverge. Viewed more positively, however, the problem is to find the decay rate a that maximizes the memory capacity M, together with the value of M attained at that rate.

3. Theorem (proof omitted). For sufficiently large n, the optimal decay rate a_opt is …
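Eq. (1), referred to above, does not survive in this excerpt. The display below is therefore only an assumed, typical form of a covariance-learning rule with multiplicative weight decay, written to match the surrounding discussion (W_ij the connection weight, a the decay rate, f the mean firing rate of the sparse patterns, x^(t) the pattern stored at step t); it should not be read as the paper's exact equation.

\[
  W_{ij}^{(t)} = a\, W_{ij}^{(t-1)} + \bigl(x_i^{(t)} - f\bigr)\bigl(x_j^{(t)} - f\bigr),
  \qquad 0 < a < 1 .
\]

Under such a rule the trace of a pattern stored t steps in the past is attenuated by a^t, so the weights stay bounded and old items gradually fade; the theorem quoted above asks which value of a maximizes the resulting number M of retrievable items.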
Similar references
Second Order Derivatives for Network Pruning: Optimal Brain Surgeon
We investigate the use of information from all second order derivatives of the error function to perform network pruning (i.e., removing unimportant weights from a trained network) in order to improve generalization and increase the speed of further training. Our method, Optimal Brain Surgeon (OBS), is significantly better than magnitude-based methods, which can often remove the wrong weights. ...
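As a rough, self-contained illustration of the saliency idea described in this abstract (not the authors' code, and applied to a toy least-squares model rather than a trained network), the sketch below computes the OBS saliency w_q^2 / (2 [H^-1]_qq) and the one-step adjustment of the remaining weights; all variable names and sizes are invented for the example.

import numpy as np

# Illustrative OBS step on a tiny least-squares model y ~ X w (assumed setup).
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
w_true = np.array([1.5, 0.0, -2.0, 0.05, 0.8])
y = X @ w_true + 0.01 * rng.normal(size=100)

w = np.linalg.lstsq(X, y, rcond=None)[0]      # "trained" weights
H = X.T @ X                                   # Hessian of the squared error
H_inv = np.linalg.inv(H)

# OBS saliency of weight q: L_q = w_q^2 / (2 [H^-1]_qq)
saliency = w**2 / (2.0 * np.diag(H_inv))
q = int(np.argmin(saliency))                  # least important weight

# Delete weight q and adjust all remaining weights in one step:
# delta_w = -(w_q / [H^-1]_qq) * H^-1 e_q
delta_w = -(w[q] / H_inv[q, q]) * H_inv[:, q]
w_pruned = w + delta_w
print("pruned weight index:", q)
print("adjusted weights:", np.round(w_pruned, 3))   # entry q is driven to zero

Deleting the least-salient weight and applying delta_w sets that weight exactly to zero while compensating with the remaining weights, which is what distinguishes OBS from simple magnitude-based pruning.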
Minimax optimal estimation of general bandable covariance matrices
Cai et al. (2010) [4] have studied the minimax optimal estimation of a collection of large bandable covariance matrices whose off-diagonal entries decay to zero at a polynomial rate. They have shown that the minimax optimal procedures are fundamentally different under Frobenius and spectral norms, regardless of the rate of polynomial decay. To gain more insight into this interesting problem, we...
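As one concrete instance of an estimator for bandable covariance matrices (the standard banding estimator, shown as a hedged illustration and not necessarily the exact procedure whose minimax rates the authors analyze), the sketch below zeroes out sample-covariance entries more than k positions from the diagonal; the dimensions, the bandwidth k, and the geometrically decaying population covariance are arbitrary choices.

import numpy as np

rng = np.random.default_rng(2)
p, n, k = 100, 50, 5
i, j = np.indices((p, p))
sigma = 0.6 ** np.abs(i - j)          # population covariance with decaying off-diagonals

X = rng.multivariate_normal(np.zeros(p), sigma, size=n)
S = np.cov(X, rowvar=False)           # sample covariance (p > n, so it is a poor estimate)

S_banded = S * (np.abs(i - j) <= k)   # banding estimator B_k(S)

err_spec_s = np.linalg.norm(S - sigma, 2)        # spectral-norm error
err_spec_b = np.linalg.norm(S_banded - sigma, 2)
err_frob_s = np.linalg.norm(S - sigma, "fro")    # Frobenius-norm error
err_frob_b = np.linalg.norm(S_banded - sigma, "fro")
print(f"spectral error:  sample {err_spec_s:.2f}  banded {err_spec_b:.2f}")
print(f"Frobenius error: sample {err_frob_s:.2f}  banded {err_frob_b:.2f}")

Banding typically shrinks both errors in a setting like this, but the abstract's point is that the optimal bandwidth and the attainable convergence rates differ between the Frobenius and spectral norms, which is what the minimax analysis makes precise.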
Outlier Detection Using Extreme Learning Machines Based on Quantum Fuzzy C-Means
One of the most important concerns of a data miner is to have accurate, error-free data: data free of human error, with complete records and correct values. In this paper, a new learning model based on an extreme learning machine neural network is proposed for outlier detection. The performance of neural networks depends on various parameters such as the structur...
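The abstract is cut off before the method is described, so the following is only a generic extreme-learning-machine sketch (random hidden layer, output weights fitted by least squares) used as a reconstruction-based outlier score; it is not the quantum fuzzy C-means variant this paper proposes, and every parameter below is an arbitrary choice for the example.

import numpy as np

rng = np.random.default_rng(3)
X_in = rng.normal(size=(200, 4))                 # inliers
X_out = rng.normal(loc=6.0, size=(5, 4))         # a few injected outliers
X = np.vstack([X_in, X_out])

n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))      # random hidden weights, never trained
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                           # hidden-layer activations

# Output weights by regularized least squares (the only "learning" in an ELM)
beta = np.linalg.solve(H.T @ H + 1e-3 * np.eye(n_hidden), H.T @ X)

score = np.linalg.norm(X - H @ beta, axis=1)     # reconstruction error as outlier score
threshold = np.percentile(score, 97)
print("flagged rows:", np.where(score > threshold)[0])  # rows 200-204 are expected to score high

The only fitted quantities are the output weights beta, which is why ELM training reduces to a single linear solve; the paper's contribution, per the abstract, lies in how the remaining parameters are chosen, which this sketch does not attempt to reproduce.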
Capacity and Error Correction Ability of Sparsely Encoded Associative Memory with Forgetting Process
An associative memory model of neural networks cannot store more items than its memory capacity. When new items are given one after another, its connection weights should be decayed so that the number of stored items does not exceed the memory capacity (forgetting process). This paper analyzes the sparsely encoded associative memory and presents the optimal decay rate that maximizes the number o...
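To make the forgetting process concrete (a toy simulation under assumed parameters and an assumed one-step retrieval rule, not the paper's analytical treatment), the sketch below stores sparse patterns one after another with several decay rates a and counts how many of the most recent patterns are still retrievable.

import numpy as np

rng = np.random.default_rng(0)
n, f, T = 300, 0.1, 1500         # neurons, firing rate, number of stored patterns
k = int(round(f * n))            # active units kept during recall

def run(a):
    patterns = (rng.random((T, n)) < f).astype(float)
    W = np.zeros((n, n))
    for x in patterns:                       # patterns arrive one after another
        W = a * W + np.outer(x - f, x - f)   # decay old traces, add the new one
    np.fill_diagonal(W, 0.0)

    stored = 0
    for x in reversed(patterns):             # newest first
        h = W @ (x - f)
        recalled = np.zeros(n)
        recalled[np.argsort(h)[-k:]] = 1.0   # one-step sparse recall
        if (recalled * x).sum() / max(x.sum(), 1) < 0.9:
            break                            # this and older patterns count as forgotten
        stored += 1
    return stored

for a in (0.90, 0.95, 0.98, 0.995, 1.0):
    print(f"decay a = {a:5.3f}: about {run(a)} recent patterns retrievable")

With no decay (a = 1) the weights eventually hold too many overlapping traces and retrieval collapses, while very strong decay erases items almost immediately; an intermediate a is expected to maximize the count, which is the quantity the paper optimizes analytically.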
Solving Fuzzy Equations Using Neural Nets with a New Learning Algorithm
Artificial neural networks have advantages such as learning, adaptation, fault tolerance, parallelism, and generalization. This paper offers a novel method for finding a solution of a fuzzy equation that is assumed to have a real solution. To this end, we apply an architecture of fuzzy neural networks in which the connection weights are real numbers. The ...
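The abstract is truncated, so the sketch below is only a loosely related toy: it finds a real-valued unknown x satisfying a fuzzy equation A*x + B = C between triangular fuzzy numbers by gradient descent on their alpha-cuts. It is not the paper's fuzzy-neural-network architecture or learning algorithm; the fuzzy numbers, the step size, and the alpha-cut representation are all assumptions made for the example.

import numpy as np

def alpha_cuts(tri, alphas):
    # Triangular fuzzy number (left, mode, right) -> interval endpoints per alpha level.
    l, m, r = tri
    return np.stack([l + alphas * (m - l), r - alphas * (r - m)], axis=1)

alphas = np.linspace(0.0, 1.0, 11)
A = alpha_cuts((1.0, 2.0, 3.0), alphas)       # fuzzy coefficient
B = alpha_cuts((0.5, 1.0, 1.5), alphas)       # fuzzy constant
x_true = 2.0
C = A * x_true + B                            # right-hand side, so x = 2 solves the equation

x = 0.1                                       # real unknown, assumed positive so endpoints scale
lr = 0.1 / np.sum(A * A)                      # step size small enough for stable descent
for _ in range(200):
    err = A * x + B - C                       # interval-wise residual over all alpha levels
    grad = 2.0 * np.sum(err * A)              # derivative of the summed squared residual
    x -= lr * grad
print(f"estimated real solution x ≈ {x:.4f}  (true value {x_true})")

The single real unknown here plays the role of the real-valued connection weights mentioned in the abstract, adjusted by a gradient-style learning rule.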